Time series averaging from a probabilistic interpretation of time-elastic kernel
In the light of regularized dynamic time warping kernels, this paper
reconsiders the concept of time elastic centroid (TEC) for a set of time
series. From this perspective, we first show how TEC can easily be addressed as
a preimage problem. Unfortunately, this preimage problem is ill-posed, may
suffer from over-fitting, especially for long time series, and even reaching a
sub-optimal solution involves heavy computational costs. We then derive two new
algorithms based on a probabilistic interpretation of kernel alignment
matrices, expressed in terms of probability distributions over sets of
alignment paths.
The first algorithm is an iterative agglomerative heuristic inspired by the
state-of-the-art DTW barycenter averaging (DBA) algorithm, which was proposed
specifically for the Dynamic Time Warping measure. The second proposed
algorithm performs a classical averaging of the aligned samples but also
averages the times of occurrence of the aligned samples, exploiting a
straightforward progressive agglomerative heuristic. An experiment comparing,
on 45 time series datasets, the classification error rates obtained by
first-nearest-neighbor classifiers that use a single medoid or centroid
estimate to represent each category shows that: i) centroid-based approaches
significantly outperform medoid-based approaches; ii) in the considered
experiments, the two proposed algorithms outperform the state-of-the-art DBA
algorithm; and iii) the second proposed algorithm, which averages jointly in
the sample space and along the time axis, emerges as the most robust time
elastic averaging heuristic, with an interesting noise reduction capability.
Index Terms: Time series averaging, Time elastic kernel, Dynamic Time Warping,
Time series clustering and classification.
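The DBA baseline referenced above iterates two steps: align every series to the current centroid with Dynamic Time Warping, then replace each centroid sample by the mean of the samples aligned to it. A minimal sketch of that scheme for scalar series (the function names `dtw_path` and `dba` are illustrative, not the paper's implementation):

```python
import numpy as np

def dtw_path(c, s):
    """DTW between 1-D series c and s; returns the optimal alignment path."""
    n, m = len(c), len(s)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (c[i - 1] - s[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack from (n, m) to recover the alignment path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def dba(series, n_iter=10):
    """Refine a centroid as the per-index mean of DTW-aligned samples."""
    centroid = np.array(series[0], dtype=float)
    for _ in range(n_iter):
        sums = np.zeros_like(centroid)
        counts = np.zeros(len(centroid))
        for s in series:
            for i, j in dtw_path(centroid, s):
                sums[i] += s[j]
                counts[i] += 1
        centroid = sums / np.maximum(counts, 1)
    return centroid
```

Seeding the centroid with the first series is one common choice; a medoid initialization is another.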
On Recursive Edit Distance Kernels with Application to Time Series Classification
This paper proposes some extensions to the work on kernels dedicated to
string or time series global alignment based on the aggregation of scores
obtained by local alignments. The extensions we propose make it possible to
construct, from the classical recursive definitions of elastic distances,
recursive edit distance (or time-warp) kernels that are positive definite if
some sufficient conditions are satisfied. The sufficient conditions we end up
with are original and weaker than those proposed in earlier works, although a
recursive regularizing term is required to obtain the proof of positive
definiteness as a direct consequence of Haussler's convolution theorem. The
classification experiments we conducted on three classical time warp distances
(two of which are metrics), using a Support Vector Machine classifier, lead us
to conclude that, when the pairwise distance matrix obtained from the training
data is far from definiteness, the positive definite recursive elastic kernels
generally outperform the distance-substituting kernels for the classical
elastic distances we tested.
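A distance matrix "far from definiteness" can be diagnosed directly: build the distance-substituting kernel K = exp(-d(x, y)/sigma) over a sample of series and inspect the smallest eigenvalue of the resulting Gram matrix, which is negative whenever the kernel is indefinite on that sample. A hedged sketch (the helper names and the plain DTW recurrence are illustrative assumptions, not the paper's code):

```python
import numpy as np

def dtw(a, b):
    """Classical DTW distance between two 1-D series (no path needed here)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = abs(a[i - 1] - b[j - 1]) + min(
                D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def gram_psd_margin(series, sigma=1.0):
    """Build the distance-substituting kernel K = exp(-d_DTW / sigma) and
    return its smallest eigenvalue; a negative value reveals indefiniteness."""
    n = len(series)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = np.exp(-dtw(series[i], series[j]) / sigma)
    return np.linalg.eigvalsh(K).min()
```

In practice this margin depends on sigma and on the sample, which is why positive definiteness has to be established structurally rather than checked empirically.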
Time Warp Edit Distance
This technical report details a family of time warp distances on the set of
discrete time series. This family is constructed as an edit distance whose
elementary operations apply to linear segments. A specific parameter allows
controlling the stiffness of the elastic matching. The distance is well suited
to the processing of event data in which each sample is associated with a
timestamp, not necessarily obtained at a constant sampling rate. Some
properties satisfied by these distances are stated and proved in this report.
Keywords: Pattern Recognition, Clustering, Algorithms, Similarity Measure.
Time Warp Edit Distance with Stiffness Adjustment for Time Series Matching
In a way similar to the string-to-string correction problem, we address time
series similarity in the light of a time-series-to-time-series correction
problem, in which the similarity between two time series is measured as the
minimum-cost sequence of "edit operations" needed to transform one time series
into the other. To define the "edit operations" we use the paradigm of a
graphical editing process and end up with a dynamic programming algorithm that
we call Time Warp Edit Distance (TWED). TWED is slightly different in form from
the Dynamic Time Warping, Longest Common Subsequence, and Edit Distance with
Real Penalty algorithms. In particular, it exposes a parameter that drives a
kind of stiffness of the elastic measure along the time axis. We show that the
similarity provided by TWED is a metric, potentially useful in time series
retrieval applications since it can benefit from the triangle inequality to
speed up the retrieval process while the parameters of the elastic measure are
tuned. In that context, a lower bound is derived that relates the matching of
time series in down-sampled representation spaces to the matching in the
original space. The empirical quality of the TWED distance is evaluated on a
simple classification task. Compared to Edit Distance, Dynamic Time Warping,
Longest Common Subsequence, and Edit Distance with Real Penalty, TWED proved
quite effective on the considered experimental task.
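The TWED recurrence itself is a small dynamic program: a match is charged the sample differences plus a stiffness term on the timestamp differences, while each deletion is charged the within-series step plus the elapsed time plus a constant penalty. A minimal sketch for scalar series with explicit timestamps (the parameter names `nu` and `lam` mirror the stiffness and gap-penalty roles; this is an illustrative implementation, not the report's reference code):

```python
import numpy as np

def twed(a, ta, b, tb, nu=1.0, lam=1.0):
    """Time Warp Edit Distance between scalar series a (timestamps ta)
    and b (timestamps tb). nu drives the stiffness along the time axis,
    lam is the constant penalty attached to delete operations."""
    # Pad both series with a dummy initial sample at time 0.
    a = [0.0] + list(a); ta = [0.0] + list(ta)
    b = [0.0] + list(b); tb = [0.0] + list(tb)
    n, m = len(a), len(b)
    D = np.full((n, m), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n):
        for j in range(1, m):
            # Delete in a: stay on b, pay the step in a plus elapsed time.
            del_a = D[i - 1, j] + abs(a[i] - a[i - 1]) \
                + nu * (ta[i] - ta[i - 1]) + lam
            # Delete in b: symmetric operation.
            del_b = D[i, j - 1] + abs(b[j] - b[j - 1]) \
                + nu * (tb[j] - tb[j - 1]) + lam
            # Match: pay both sample pairs and their timestamp gaps.
            match = (D[i - 1, j - 1]
                     + abs(a[i] - b[j]) + abs(a[i - 1] - b[j - 1])
                     + nu * (abs(ta[i] - tb[j]) + abs(ta[i - 1] - tb[j - 1])))
            D[i, j] = min(del_a, del_b, match)
    return D[n - 1, m - 1]
```

With nu > 0 and lam >= 0 the resulting measure is symmetric and, per the report, a metric, so twed(a, ta, a, ta) is 0 and the triangle inequality can prune retrieval.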
Down-Sampling coupled to Elastic Kernel Machines for Efficient Recognition of Isolated Gestures
In the field of gestural action recognition, many studies have focused on
dimensionality reduction along the spatial axis, to reduce both the variability
of gestural sequences expressed in the reduced space, and the computational
complexity of their processing. Noticeably, very few of these methods have
explicitly addressed dimensionality reduction along the time axis. This is,
however, a major issue with regard to the use of elastic distances, which are
characterized by a quadratic complexity. To partially fill this apparent gap,
we present in this paper an approach based on temporal down-sampling combined
with elastic kernel machine learning. We show experimentally, on two data sets
that are widely referenced in the domain of human gesture recognition and that
differ greatly in the quality of their motion capture, that it is possible to
significantly reduce the number of skeleton frames while maintaining a good
recognition rate. The method proves to give satisfactory results at a level
currently reached by state-of-the-art methods on these data sets. The
computational complexity reduction makes this approach eligible for real-time
applications. Comment: ICPR 2014, International Conference on Pattern
Recognition, Stockholm, Sweden (2014).
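The temporal down-sampling step itself can be as simple as keeping k frames taken uniformly along the time axis, which cuts the cost of a quadratic elastic distance from O(T^2) to O(k^2). A hedged sketch (the helper name `downsample` is an assumption, not the paper's code):

```python
import numpy as np

def downsample(seq, k):
    """Keep k frames sampled uniformly along the time axis of seq (T x d).

    A DTW- or TWED-style elastic distance on the result costs O(k^2)
    instead of O(T^2) on the full sequence."""
    seq = np.asarray(seq, dtype=float)
    idx = np.linspace(0, len(seq) - 1, num=k).round().astype(int)
    return seq[idx]
```

The first and last frames are always retained, so the end points of the gesture survive the reduction.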